The metric property of the quantum Jensen-Shannon divergence
Authors
Abstract
In this short note, we prove that the square root of the quantum Jensen-Shannon divergence is a true metric on the cone of positive matrices, and hence, in particular, on the state space.
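The quantity in question is the quantum Jensen-Shannon divergence QJSD(rho, sigma) = S((rho + sigma)/2) - (S(rho) + S(sigma))/2, where S denotes the von Neumann entropy. As a minimal numerical sketch (not the paper's proof; function names are illustrative), one can check the triangle inequality for the square root of QJSD on random density matrices:

import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log rho), computed from the eigenvalues (natural log)
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerically zero eigenvalues
    return float(-np.sum(evals * np.log(evals)))

def qjsd(rho, sigma):
    # QJSD(rho, sigma) = S((rho + sigma)/2) - (S(rho) + S(sigma)) / 2
    mix = 0.5 * (rho + sigma)
    return von_neumann_entropy(mix) - 0.5 * (von_neumann_entropy(rho) + von_neumann_entropy(sigma))

def random_density_matrix(d, rng):
    # random d x d density matrix: positive semidefinite with unit trace
    a = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    rho = a @ a.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(0)
rho, sigma, tau = (random_density_matrix(4, rng) for _ in range(3))

# the metric property asserts the triangle inequality for sqrt(QJSD)
lhs = np.sqrt(qjsd(rho, tau))
rhs = np.sqrt(qjsd(rho, sigma)) + np.sqrt(qjsd(sigma, tau))
print(lhs <= rhs)

Such a check only illustrates the statement on random samples; the contribution of the paper is the proof that it holds on the whole cone.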
Similar resources
Metric character of the quantum Jensen-Shannon divergence
P. W. Lamberti, A. P. Majtey, A. Borras, M. Casas, and A. Plastino Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba, and CONICET, C.C. 727, La Plata 1900, Argentina Departament de Física and IFISC, Universitat de les Illes Balears, 07122 Palma de Mallorca, Spain Instituto de Física La Plata, Universidad Nacional de La Plata and CON...
Manifold Learning and the Quantum Jensen-Shannon Divergence Kernel
The quantum Jensen-Shannon divergence kernel [1] was recently introduced in the context of unattributed graphs where it was shown to outperform several commonly used alternatives. In this paper, we study the separability properties of this kernel and we propose a way to compute a low-dimensional kernel embedding where the separation of the different classes is enhanced. The idea stems from the ...
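The embedding construction of that paper is not reproduced here; as a rough sketch, assuming a precomputed QJSD kernel matrix K between graphs, a kernel-PCA style low-dimensional embedding of the kind described could look like:

import numpy as np

def kernel_embedding(K, dim):
    # center the kernel matrix and keep the leading eigen-directions (kernel PCA)
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    evals, evecs = np.linalg.eigh(Kc)
    order = np.argsort(evals)[::-1][:dim]
    evals, evecs = evals[order], evecs[:, order]
    return evecs * np.sqrt(np.clip(evals, 0.0, None))

# toy usage with a random positive semidefinite stand-in for the QJSD kernel
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 10))
K = A @ A.T
X = kernel_embedding(K, dim=2)  # one 2-D point per graph
print(X.shape)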
Attributed Graph Similarity from the Quantum Jensen-Shannon Divergence
One of the most fundamental problems we face in the graph domain is that of establishing the similarity, or alternatively the distance, between graphs. In this paper, we address the problem of measuring the similarity between attributed graphs. In particular, we propose a novel way to measure the similarity through the evolution of a continuous-time quantum walk. Given a pair of graphs, we ...
Nonextensive Generalizations of the Jensen-Shannon Divergence
Convexity is a key concept in information theory, namely via the many implications of Jensen’s inequality, such as the non-negativity of the Kullback-Leibler divergence (KLD). Jensen’s inequality also underlies the concept of Jensen-Shannon divergence (JSD), which is a symmetrized and smoothed version of the KLD. This paper introduces new JSD-type divergences, by extending its two building bloc...
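For reference, the classical definitions that snippet builds on are KL(p || q) = sum_i p_i log(p_i / q_i) and JSD(p, q) = (1/2) KL(p || m) + (1/2) KL(q || m) with m = (p + q)/2. A minimal sketch of these standard formulas (not the nonextensive generalizations introduced in that paper):

import numpy as np

def kld(p, q):
    # Kullback-Leibler divergence KL(p || q) for discrete distributions
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jsd(p, q):
    # Jensen-Shannon divergence: symmetrized, smoothed KLD via the midpoint m
    m = 0.5 * (p + q)
    return 0.5 * kld(p, m) + 0.5 * kld(q, m)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.1, 0.4, 0.5])
print(jsd(p, q), jsd(q, p))  # symmetric by construction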
Non-parametric Jensen-Shannon Divergence
Quantifying the difference between two distributions is a common problem in many machine learning and data mining tasks. What is also common in many tasks is that we only have empirical data. That is, we do not know the true distributions nor their form, and hence, before we can measure their divergence we first need to assume a distribution or perform estimation. For exploratory purposes this ...
Journal
Journal title: Advances in Mathematics
Year: 2021
ISSN: 1857-8365, 1857-8438
DOI: https://doi.org/10.1016/j.aim.2021.107595